#DeepLearning #LossFunction
Loss Function
- Mean Squared Error (MSE)
- Mean Absolute Error (MAE)
- Cross-Entropy Loss
- Hinge Loss
- Huber Loss
- Kullback-Leibler (KL) Divergence
- Negative Log-Likelihood (NLL)
Mean Squared Error (MSE):
- Used for regression tasks. Measures the average squared difference between the actual values $y_i$ and the predicted values $\hat{y}_i$:
$$\mathrm{MSE} = \frac{1}{n}\sum_{i=1}^{n}\left(y_i - \hat{y}_i\right)^2$$
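A minimal NumPy sketch (the helper name `mse` and the sample arrays are illustrative, not from the note):

```python
import numpy as np

def mse(y_true: np.ndarray, y_pred: np.ndarray) -> float:
    """Mean squared error: average of the squared residuals."""
    return float(np.mean((y_true - y_pred) ** 2))

# Illustrative regression batch
y_true = np.array([3.0, -0.5, 2.0, 7.0])
y_pred = np.array([2.5, 0.0, 2.0, 8.0])
print(mse(y_true, y_pred))  # (0.25 + 0.25 + 0 + 1) / 4 = 0.375
```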
Mean Absolute Error (MAE):
- Also used for regression tasks. It measures the average absolute difference between the actual and predicted values:
$$\mathrm{MAE} = \frac{1}{n}\sum_{i=1}^{n}\left|y_i - \hat{y}_i\right|$$
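The same idea for MAE, again with illustrative values:

```python
import numpy as np

def mae(y_true: np.ndarray, y_pred: np.ndarray) -> float:
    """Mean absolute error: average of the absolute residuals."""
    return float(np.mean(np.abs(y_true - y_pred)))

y_true = np.array([3.0, -0.5, 2.0, 7.0])
y_pred = np.array([2.5, 0.0, 2.0, 8.0])
print(mae(y_true, y_pred))  # (0.5 + 0.5 + 0 + 1) / 4 = 0.5
```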
Cross-Entropy Loss (Log Loss):
- Commonly used for classification problems:
$$L = -\sum_{i=1}^{C} y_i \log(p_i)$$
where $C$ is the total number of classes, $y_i$ is the true probability for class $i$, and $p_i$ is the predicted probability for that class.
Typically $y$ is a one-hot vector, since the true class is known with certainty, so the loss simplifies to
$$L = -\log(p_c)$$
where $c$ is the index of the true class.
Cross Entropy

```functionplot
---
title: Cross Entropy
xLabel: 
yLabel: 
bounds: [0,10,-3,1]
disableZoom: true
grid: true
---
g(x) = -log(x)
```

The derivative of cross entropy

```functionplot
---
title: The derivative of cross entropy
xLabel: 
yLabel: 
bounds: [0,10,-1.8,0]
disableZoom: true
grid: true
---
g(x) = -1/x
```
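A minimal NumPy sketch of the one-hot case above (the `eps` clipping and the sample probabilities are assumptions, for numerical safety and illustration):

```python
import numpy as np

def cross_entropy(y_true: np.ndarray, p_pred: np.ndarray, eps: float = 1e-12) -> float:
    """Cross-entropy L = -sum_i y_i * log(p_i) over class probabilities."""
    p_pred = np.clip(p_pred, eps, 1.0)  # avoid log(0)
    return float(-np.sum(y_true * np.log(p_pred)))

# One-hot target: only the true class contributes, so L = -log(p_c)
y_true = np.array([0.0, 1.0, 0.0])
p_pred = np.array([0.1, 0.7, 0.2])
print(cross_entropy(y_true, p_pred))  # -log(0.7) ≈ 0.357
```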
Hinge Loss:
- Typically used for "maximum-margin" classifiers like Support Vector Machines (SVM). For a label $y \in \{-1, +1\}$ and a raw score $f(x)$:
$$L = \max\left(0,\ 1 - y \cdot f(x)\right)$$
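A sketch of hinge loss over a batch, assuming labels in {-1, +1} and raw (unsquashed) scores:

```python
import numpy as np

def hinge_loss(y_true: np.ndarray, scores: np.ndarray) -> float:
    """Mean of max(0, 1 - y * f(x)); zero once the margin exceeds 1."""
    return float(np.mean(np.maximum(0.0, 1.0 - y_true * scores)))

y_true = np.array([1.0, -1.0, 1.0])   # labels in {-1, +1}
scores = np.array([0.8, -2.0, -0.5])  # raw classifier scores f(x)
print(hinge_loss(y_true, scores))     # (0.2 + 0 + 1.5) / 3 ≈ 0.567
```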
Huber Loss:
- A regression loss that is less sensitive to outliers than MSE: it is quadratic for residuals at most $\delta$ in magnitude and linear beyond that:
$$L_\delta(y, \hat{y}) = \begin{cases} \frac{1}{2}\left(y - \hat{y}\right)^2 & \text{if } \left|y - \hat{y}\right| \le \delta \\ \delta\left(\left|y - \hat{y}\right| - \frac{1}{2}\delta\right) & \text{otherwise} \end{cases}$$
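A sketch with an illustrative outlier, using the default $\delta = 1$:

```python
import numpy as np

def huber_loss(y_true: np.ndarray, y_pred: np.ndarray, delta: float = 1.0) -> float:
    """Quadratic for |residual| <= delta, linear beyond it."""
    r = y_true - y_pred
    quadratic = 0.5 * r ** 2
    linear = delta * (np.abs(r) - 0.5 * delta)
    return float(np.mean(np.where(np.abs(r) <= delta, quadratic, linear)))

y_true = np.array([1.0, 2.0, 10.0])
y_pred = np.array([1.5, 2.0, 4.0])  # the last point is an outlier
print(huber_loss(y_true, y_pred))   # (0.125 + 0 + 5.5) / 3 = 1.875
```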
Kullback-Leibler (KL) Divergence:
- Measures how one probability distribution $P$ diverges from a second probability distribution $Q$:
$$D_{\mathrm{KL}}(P \,\|\, Q) = \sum_i P(i) \log \frac{P(i)}{Q(i)}$$
Often used in variational inference or generative models.
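A discrete-distribution sketch (the clipping epsilon is an assumption to keep the log finite):

```python
import numpy as np

def kl_divergence(p: np.ndarray, q: np.ndarray, eps: float = 1e-12) -> float:
    """D_KL(P || Q) = sum_i P(i) * log(P(i) / Q(i))."""
    p = np.clip(p, eps, 1.0)
    q = np.clip(q, eps, 1.0)
    return float(np.sum(p * np.log(p / q)))

p = np.array([0.4, 0.6])    # "true" distribution P
q = np.array([0.5, 0.5])    # approximating distribution Q
print(kl_divergence(p, q))  # ≈ 0.020; note D_KL is not symmetric
```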
Negative Log-Likelihood Loss (NLL):
- Used for classification problems, particularly in models like neural networks with a softmax output.
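A sketch pairing a numerically stable log-softmax with NLL (the logits are illustrative; in PyTorch the same pairing is `log_softmax` followed by `NLLLoss`):

```python
import numpy as np

def log_softmax(logits: np.ndarray) -> np.ndarray:
    """Numerically stable log-softmax: subtract the max before exponentiating."""
    shifted = logits - np.max(logits)
    return shifted - np.log(np.sum(np.exp(shifted)))

def nll_loss(logits: np.ndarray, target: int) -> float:
    """Negative log-likelihood of the target class under softmax(logits)."""
    return float(-log_softmax(logits)[target])

logits = np.array([2.0, 1.0, 0.1])  # raw network outputs (illustrative)
print(nll_loss(logits, target=0))   # ≈ 0.417
```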